Scalars are universal: Equivariant machine learning, structured like classical physics
There has been enormous progress in the last few years in designing neural networks that respect the fundamental symmetries and coordinate freedoms of physical law. Some of these frameworks make use of irreducible representations, some make use of high-order tensor objects, and some apply symmetry-enforcing constraints. Different physical laws obey different combinations of fundamental symmetries, but a large fraction (possibly all) of classical physics is equivariant to translation, rotation, reflection (parity), boost (relativity), and permutations. Here we show that it is simple to parameterize universally approximating polynomial functions that are equivariant under these symmetries, or under the Euclidean, Lorentz, and Poincaré groups, at any dimensionality $d$. The key observation is that nonlinear O($d$)-equivariant (and related-group-equivariant) functions can be universally expressed in terms of a lightweight collection of scalars---scalar products and scalar contractions of the scalar, vector, and tensor inputs. We complement our theory with numerical examples that show that the scalar-based method is simple, efficient, and scalable.
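The scalars-first construction described above can be illustrated with a minimal NumPy sketch. Given input vectors, all pairwise dot products are O($d$)-invariant scalars; feeding them to any learned function and using its outputs as coefficients on the input vectors yields an O($d$)-equivariant vector output. The toy `mlp` below is a stand-in assumption, not the authors' architecture:

```python
import numpy as np

def equivariant_vector_fn(vectors, mlp):
    """O(d)-equivariant function built only from invariant scalars.

    vectors: (n, d) array of input vectors
    mlp: maps the flattened invariant scalars to n coefficients
    """
    # Pairwise dot products <v_i, v_j> are invariant under any
    # orthogonal transformation of the inputs.
    scalars = (vectors @ vectors.T).flatten()   # (n*n,)
    coeffs = mlp(scalars)                       # (n,)
    # A scalar-weighted sum of the input vectors rotates with them,
    # so the output is equivariant by construction.
    return coeffs @ vectors                     # (d,)

# Demo: equivariance under a random orthogonal transformation.
rng = np.random.default_rng(0)
n, d = 4, 3
V = rng.normal(size=(n, d))
mlp = lambda s: np.tanh(s)[:n]                  # toy invariant -> coefficient map
Q, _ = np.linalg.qr(rng.normal(size=(d, d)))    # random element of O(d)
rotated_then_applied = equivariant_vector_fn(V @ Q, mlp)
applied_then_rotated = equivariant_vector_fn(V, mlp) @ Q
```

Because the coefficients depend on the inputs only through invariant scalars, `rotated_then_applied` and `applied_then_rotated` agree to numerical precision, which is exactly the equivariance property the abstract claims.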
Quantum Computing Is Coming. What Can It Do?
Quantum technology is approaching the mainstream. Goldman Sachs recently announced that it could introduce quantum algorithms to price financial instruments in as little as five years. Honeywell anticipates that quantum computing will form a $1 trillion industry in the decades ahead. But why are firms like Goldman taking this leap, especially when commercially viable quantum computers may still be years away? To understand what's going on, it's useful to take a step back and examine what exactly it is that computers do.
- Banking & Finance (0.56)
- Automobiles & Trucks (0.36)
- Aerospace & Defense (0.36)
- Information Technology > Hardware (1.00)
- Information Technology > Artificial Intelligence (1.00)
New research indicates the whole universe could be a giant neural network
The core idea is deceptively simple: every observable phenomenon in the entire universe can be modeled by a neural network. And that means, by extension, the universe itself may be a neural network. Vitaly Vanchurin, a professor of physics at the University of Minnesota Duluth, published a paper last August entitled "The World as a Neural Network" on the arXiv pre-print server. It managed to slide past our notice until today, when Futurism's Victor Tangermann published an interview with Vanchurin discussing the paper. As its abstract puts it: "We discuss a possibility that the entire universe on its most fundamental level is a neural network."
- North America > United States > Minnesota > St. Louis County > Duluth (0.26)
What Is Quantum Machine Learning And How Can It Help Us?
Artificial intelligence refers, among other things, to machines' capacity to demonstrate some degree of what humans consider "intelligence". This progress is being driven by the rapid advancement of machine learning: getting machines to learn for themselves rather than pre-programming them with fixed rules. Humans excel at this kind of learning, but it has proved difficult to replicate artificially. Training a machine to recognise a cat doesn't mean inputting a set definition of what a cat looks like. Instead, many different images of cats are inputted, and the aim is that the computer learns to distil the underlying "cat-like" pattern of pixels.